Maximum entropy principle and power-law tailed distributions


Similar articles

Superstatistical distributions from a maximum entropy principle.

We deal with a generalized statistical description of nonequilibrium complex systems based on least biased distributions given some prior information. A maximum entropy principle is introduced that allows for the determination of the distribution of the fluctuating intensive parameter beta of a superstatistical system, given certain constraints on the complex system under consideration. We appl...
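The superstatistical mixture described above can be sketched numerically: averaging Boltzmann factors exp(-βE) over a Gamma-distributed intensive parameter β reproduces the known closed form (1 + scale·E)^(-shape), i.e. a power-law tail. This is a minimal illustration of the construction, not the paper's own maximum entropy derivation; the shape/scale values are arbitrary choices.

```python
import numpy as np

def superstat_marginal(E, shape, scale, n_beta=200000, seed=0):
    """Monte Carlo estimate of the superstatistical marginal
    p(E) = <exp(-beta*E)> with beta drawn from a Gamma distribution."""
    rng = np.random.default_rng(seed)
    beta = rng.gamma(shape, scale, n_beta)
    return np.mean(np.exp(-np.outer(E, beta)), axis=1)

E = np.array([0.0, 1.0, 5.0, 20.0])
est = superstat_marginal(E, shape=2.0, scale=0.5)

# The Laplace transform of the Gamma density gives the mixture exactly:
# (1 + scale*E)^(-shape) -- a q-exponential / power-law tailed form.
exact = (1.0 + 0.5 * E) ** (-2.0)
```

The agreement between `est` and `exact` shows why β-fluctuations of Gamma type turn exponential statistics into power-law tailed statistics.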



Projective Power Entropy and Maximum Tsallis Entropy Distributions

We discuss a one-parameter family of generalized cross entropy between two distributions with the power index, called the projective power entropy. The cross entropy is essentially reduced to the Tsallis entropy if two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated including a characterizati...
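As a quick sketch of the reduction mentioned above: the Tsallis entropy S_q = (1 - Σ p_i^q)/(q - 1) recovers the Shannon entropy in the limit q → 1. The function below is a minimal standalone implementation, not the projective power entropy of the paper itself.

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum p_i^q) / (q - 1); recovers Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore zero-probability outcomes
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
shannon = -sum(pi * np.log(pi) for pi in p)
```

For q near 1, `tsallis_entropy(p, q)` approaches `shannon`; for q = 2 it reduces to the simple quadratic index 1 - Σ p_i².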


Kirchhoff's loop law and the maximum entropy production principle.

In contrast to the standard derivation of Kirchhoff's loop law, which invokes the electric potential, we show, for a linear planar electric network in a stationary state at fixed temperature, that the loop law can be derived from the maximum entropy production principle. This means that the currents in the network branches are distributed in such a way as to achieve the state of maximum entropy produ...
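A related classical result makes the variational flavor of this claim easy to see: among all current splits consistent with Kirchhoff's current law, the stationary split extremizes the Joule dissipation, and the stationarity condition is precisely the loop law. The hypothetical two-resistor example below illustrates that extremal property; it is not the paper's own MEP derivation.

```python
import numpy as np

# Hypothetical example: two resistors in parallel carry a fixed total
# current I. Among all splits I1 + I2 = I (current conservation), the
# stationary split extremizes the dissipation R1*I1^2 + R2*I2^2, and
# stationarity gives the loop law R1*I1 = R2*I2 (equal voltage drops).
R1, R2, I = 3.0, 6.0, 1.5

splits = np.linspace(0.0, I, 100001)              # candidate values of I1
dissipation = R1 * splits**2 + R2 * (I - splits)**2
I1 = splits[np.argmin(dissipation)]               # extremal split
I2 = I - I1

divider = I * R2 / (R1 + R2)                      # textbook current divider
```

The numerically extremal `I1` matches the current-divider formula, and `R1*I1 ≈ R2*I2`, which is the loop law for this single loop.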


Probability Distributions and Maximum Entropy

If we want to assign probabilities to an event, and see no reason for one outcome to occur more often than any other, then the events are assigned equal probabilities. This is called the principle of insufficient reason, or principle of indifference, and goes back to Laplace. If we happen to know (or learn) something about the non-uniformity of the outcomes, how should the assignment of probabi...
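When we do learn something beyond indifference, the maximum entropy update has a standard closed form: with a mean constraint, the least biased distribution is exponential in the constrained quantity. The sketch below works Jaynes' classic dice example, choosing the multiplier by bisection so that a six-sided die has mean 4.5; the function name and tolerances are illustrative choices.

```python
import math

def maxent_die(target_mean, lo=-10.0, hi=10.0, iters=200):
    """Maximum-entropy distribution on faces 1..6 subject to a fixed mean:
    p_i proportional to exp(lam * i), with lam found by bisection
    (the mean is monotonically increasing in lam)."""
    faces = range(1, 7)

    def mean(lam):
        w = [math.exp(lam * i) for i in faces]
        Z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / Z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * i) for i in faces]
    Z = sum(w)
    return [wi / Z for wi in w]

p = maxent_die(4.5)   # Jaynes' "Brandeis dice" constraint
```

With mean 3.5 the multiplier vanishes and the uniform (indifference) assignment is recovered; with mean 4.5 the probabilities tilt toward the higher faces.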


Power Law and Entropy

Shannon[1] and Yavuz[2] estimated the entropy of real documents. This note derives an upper bound on entropy from the power law. Let D be a set of documents and fr(w) be the number of occurrences of a word w in the document. Given an integer k, we denote by F(k) the number of words whose frequency is k, i.e., F(k) = ♯{w ∈ D | fr(w) = k}. The entropy of the document set D is defined by E(D) ...
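The entropy definition in the abstract is cut off; assuming the standard Shannon entropy over word frequencies, it can be computed either directly from the words or from the frequency-of-frequencies spectrum F(k) defined above. A minimal sketch under that assumption:

```python
import math
from collections import Counter

def doc_entropy(words):
    """Shannon entropy (bits) of the word-frequency distribution:
    E(D) = -sum_w (fr(w)/N) * log2(fr(w)/N), with N total word occurrences."""
    fr = Counter(words)
    N = sum(fr.values())
    return -sum((c / N) * math.log2(c / N) for c in fr.values())

def entropy_from_spectrum(F):
    """The same entropy computed from the spectrum F[k] = number of
    distinct words occurring exactly k times (the F(k) of the note)."""
    N = sum(k * Fk for k, Fk in F.items())
    return -sum(Fk * (k / N) * math.log2(k / N) for k, Fk in F.items())

doc = "a a a b b c".split()   # fr: a->3, b->2, c->1, so F = {3:1, 2:1, 1:1}
```

Rewriting the entropy as a sum over k weighted by F(k) is what lets a power-law assumption on F(k) be turned into an upper bound on E(D).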



Journal

Journal title: The European Physical Journal B

Year: 2009

ISSN: 1434-6028,1434-6036

DOI: 10.1140/epjb/e2009-00161-0